
    An efficient, approximate path-following algorithm for elastic net based nonlinear spike enhancement

    Unwanted spike noise in a digital signal is a common problem in digital filtering. Sometimes, however, the spikes are wanted and other, superimposed signals are unwanted; linear, time-invariant (LTI) filtering is then ineffective because the spikes are wideband, overlapping with independent noise in the frequency domain, so no LTI filter can separate them, necessitating nonlinear filtering. However, there are applications in which the noise includes drift or smooth signals for which LTI filters are ideal. We describe a nonlinear filter, formulated as the solution to an elastic net regularization problem, which attenuates band-limited signals and independent noise while enhancing superimposed spikes. Making use of known analytic solutions, a novel, approximate path-following algorithm is given that provides a good filtered output with reduced computational effort compared to standard convex optimization methods. Accurate performance is shown on real, noisy electrophysiological recordings of neural spikes.
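As a minimal, hypothetical illustration of the elastic-net shrinkage the abstract mentions (not the paper's actual filter, which also handles drift and band-limited noise): for the simplest elastic net problem with an identity design matrix, min_x 0.5*(y - x)^2 + lam1*|x| + 0.5*lam2*x^2, each sample has the closed-form solution soft(y, lam1) / (1 + lam2), so small or smooth content is shrunk to zero while large spikes survive.

```python
# Hypothetical sketch: elementwise elastic-net shrinkage of a noisy signal.
# Solves min_x 0.5*(y - x)^2 + lam1*|x| + 0.5*lam2*x^2 per sample,
# whose closed-form solution is soft_threshold(y, lam1) / (1 + lam2).
def soft_threshold(v, lam):
    """Soft-thresholding operator from the L1 penalty."""
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def elastic_net_shrink(signal, lam1, lam2):
    """Shrink each sample: low-level noise -> 0, large spikes retained."""
    return [soft_threshold(v, lam1) / (1.0 + lam2) for v in signal]

noisy = [0.1, -0.2, 5.0, 0.05, -4.0, 0.3]   # two spikes in low-level noise
out = elastic_net_shrink(noisy, lam1=0.5, lam2=0.1)
# Small samples are zeroed; the two spikes pass through, slightly shrunk.
```

The path-following idea in the paper amounts to tracing such solutions efficiently as the regularization weights vary; the sketch above shows only a single fixed setting.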

    Objective dysphonia quantification in vocal fold paralysis: comparing nonlinear with classical measures

    Clinical acoustic voice-recording analysis is usually performed using classical perturbation measures, including jitter, shimmer and noise-to-harmonic ratios. However, the restrictive mathematical limitations of these measures prevent analysis of severely dysphonic voices. Previous studies of alternative nonlinear random measures have addressed a wide variety of vocal pathologies. Here, we analyze a single vocal-pathology cohort, testing the performance of these alternative measures alongside classical measures.

We present pre- and post-operative voice analysis in unilateral vocal fold paralysis (UVFP) patients undergoing standard medialisation thyroplasty surgery, and in healthy controls, using jitter, shimmer and noise-to-harmonic ratio (NHR), together with the nonlinear measures recurrence period density entropy (RPDE), detrended fluctuation analysis (DFA) and correlation dimension. Systematizing the preparative editing of the recordings, we found that on healthy controls the novel measures were more stable, and hence more reliable, than the classical measures.

RPDE and jitter are sensitive to improvements pre- to post-operation. Shimmer, NHR and DFA showed no significant change (p > 0.05). All measures detect statistically significant and clinically important differences between controls and patients, both treated and untreated (p < 0.001, AUC > 0.7). Pre- to post-operation, GRBAS ratings show statistically significant and clinically important improvement in overall dysphonia grade (G) (AUC = 0.946, p < 0.001).

Re-calculating AUCs from other study data, we compare these results in terms of clinical importance. We conclude that, when preparative editing is systematized, nonlinear random measures may be useful tools for monitoring UVFP treatment effectiveness, and that there may be applications for other forms of dysphonia.
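To make one of the classical measures above concrete, here is a hedged sketch of "local" jitter: the mean absolute difference between consecutive glottal-cycle periods, divided by the mean period. The period values are illustrative; real analysis extracts cycle periods from the acoustic waveform.

```python
# Illustrative sketch of local jitter (%), one of the classical
# perturbation measures compared in the study.
def local_jitter(periods):
    """Jitter (%) = mean |T[i] - T[i-1]| / mean T * 100."""
    if len(periods) < 2:
        raise ValueError("need at least two cycle periods")
    diffs = [abs(a - b) for a, b in zip(periods[1:], periods[:-1])]
    mean_diff = sum(diffs) / len(diffs)
    mean_period = sum(periods) / len(periods)
    return 100.0 * mean_diff / mean_period

# Hypothetical cycle periods (seconds) for a ~100 Hz voice with slight
# cycle-to-cycle variation; severely dysphonic voices break the assumption
# that such periods can be reliably extracted at all, which motivates the
# nonlinear measures (RPDE, DFA) studied in the paper.
cycles = [0.0100, 0.0102, 0.0099, 0.0101, 0.0100]
j = local_jitter(cycles)   # roughly 2% for this example
```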

    Polymorphic dynamic programming by algebraic shortcut fusion

    Dynamic programming (DP) is a broadly applicable algorithmic design paradigm for the efficient, exact solution of otherwise intractable combinatorial problems. However, the design of such algorithms is often presented informally and in an ad-hoc manner, and as a result is often difficult to apply correctly. In this paper, we present a rigorous algebraic formalism for systematically deriving novel DP algorithms, either from existing DP algorithms or from simple functional recurrences. These derivations lead to algorithms which are provably correct and polymorphic over any semiring, which means that they can be applied to the full scope of combinatorial problems expressible in terms of semirings. This includes, for example: optimization, optimal probability and Viterbi decoding, probabilistic marginalization, logical inference, fuzzy sets, differentiable softmax, and relational and provenance queries. The approach, building on many ideas from the existing literature on constructive algorithmics, exploits generic properties of (semiring) polymorphic functions, tupling and formal sums (lifting), and algebraic simplifications arising from constraint algebras. We demonstrate the effectiveness of this formalism on example applications arising in signal processing, bioinformatics and reliability engineering.
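A small sketch of semiring polymorphism (not the paper's formalism, just the underlying idea): one DP routine over paths in a DAG, parameterised by the semiring operations (plus, times, zero, one). Swapping the semiring changes which combinatorial problem is solved without touching the recurrence.

```python
# One generic DP recurrence, polymorphic over any semiring.
from typing import Callable, Dict, List, Tuple

Semiring = Tuple[Callable, Callable, object, object]  # (plus, times, zero, one)

def path_value(edges: Dict[str, List[Tuple[str, object]]],
               source: str, target: str, order: List[str],
               sr: Semiring):
    """Semiring-sum over all source->target paths of the semiring-product
    of edge weights; nodes must be given in topological order."""
    plus, times, zero, one = sr
    val = {v: zero for v in order}
    val[source] = one
    for v in order:
        for (w, weight) in edges.get(v, []):
            val[w] = plus(val[w], times(val[v], weight))
    return val[target]

# Counting semiring (+, *, 0, 1): number of distinct paths.
counting = (lambda a, b: a + b, lambda a, b: a * b, 0, 1)
# Tropical (min, +) semiring: shortest-path length.
tropical = (min, lambda a, b: a + b, float("inf"), 0.0)

dag = {"s": [("a", 1.0), ("b", 4.0)],
       "a": [("b", 1.0), ("t", 5.0)],
       "b": [("t", 1.0)]}
order = ["s", "a", "b", "t"]

# For counting, every edge weight is that semiring's multiplicative identity.
count_dag = {u: [(w, 1) for (w, _) in vs] for u, vs in dag.items()}
n_paths = path_value(count_dag, "s", "t", order, counting)   # 3 paths
shortest = path_value(dag, "s", "t", order, tropical)        # s->a->b->t = 3.0
```

Probabilistic marginalization (sum, product over probabilities) and Viterbi decoding (max, product) drop in the same way, which is the sense in which one derivation covers the whole family of problems.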

    A Comparative Study of the Magnitude, Frequency and Distribution of Intense Rainfall in the United Kingdom

    During the 1960s, a study was made of the magnitude, frequency and distribution of intense rainfall over the UK, employing data from more than 120 daily-read rain gauges covering the period 1911 to 1960. Using the same methodology, that study was recently updated with data for the period 1961 to 2006 from the same gauges, or from gauges nearby. This paper describes the techniques applied to ensure consistency of data and statistical modelling. It presents a comparison of patterns of extreme rainfall for the two periods and discusses the changes that have taken place. Most noticeably, increases of up to 20% have occurred in the north west of the country and in parts of East Anglia. There have also been changes in other areas, including decreases of the same magnitude over central England. The implications of these changes are considered.
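The paper's own statistical model is not specified in this abstract; as a generic illustration of rainfall frequency analysis, here is a Gumbel fit to annual-maximum rainfall by the method of moments, with the T-year return level read off from the fitted quantile function. The data values are invented for illustration, not UK gauge data.

```python
# Generic illustration: Gumbel (EV1) return-level estimate for annual
# maximum rainfall, fitted by the method of moments.
import math

def gumbel_return_level(annual_maxima, T):
    """Estimate the T-year return level from annual maxima."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi      # scale parameter
    mu = mean - 0.5772 * beta                  # location (Euler-Mascheroni const.)
    # Quantile at non-exceedance probability 1 - 1/T:
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual daily-maximum rainfalls (mm):
maxima = [42.0, 55.0, 38.0, 61.0, 47.0, 70.0, 44.0, 52.0, 58.0, 49.0]
r50 = gumbel_return_level(maxima, 50)   # estimated 50-year return level
```

Comparing return levels fitted separately to the 1911-1960 and 1961-2006 records is one standard way such period-to-period changes in extremes are quantified.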

    Machine learning for large-scale wearable sensor data in Parkinson disease: concepts, promises, pitfalls, and futures

    For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, "wearable," sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that "learn" from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses the implications of this technology and presents a practical road map for realizing its full potential in PD research and practice.

    Probabilistic Modelling for Unsupervised Analysis of Human Behaviour in Smart Cities

    The growth of urban areas in recent years has motivated a large number of new sensor applications in smart cities. At the centre of many new applications stands the goal of gaining insight into human activity. Scalable monitoring of urban environments can facilitate better-informed city planning, efficient security, and regular transport and commerce. Much of this monitoring capability has already been deployed; however, most of it relies on expensive motion imagery and privacy-invading video cameras. A low-cost sensor alternative exists that enables deep understanding of population behaviour: Global Positioning System (GPS) data. However, the automated analysis of such low-dimensional sensor data requires new flexible and structured techniques that can describe the generative distribution and time dynamics of the observed data, while accounting for external contextual influences such as time of day or the difference between weekend and weekday trends. In this paper, we propose a novel time-series analysis technique that allows for multiple different transition matrices depending on the data's contextual realisations, all sharing adaptive observational models that govern the global distribution of the data given a latent sequence. The proposed approach, which we name the Adaptive Input Hidden Markov Model (AI-HMM), is tested on two datasets from different sensor types: GPS trajectories of taxis and derived vehicle counts in populated areas. We demonstrate that our model can group different categories of behavioural trends and identify time-specific anomalies.
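A simplified, hypothetical sketch of the key AI-HMM idea: a hidden Markov model whose transition matrix is selected at each time step by an observed context (here, weekday vs weekend), while all contexts share one set of emission models. It is shown via the standard forward algorithm for the sequence likelihood; the paper's actual parameterisation and learning procedure may differ.

```python
# Sketch: forward algorithm for an HMM with context-dependent transition
# matrices and shared emission distributions (discrete observations).
import math

def forward_loglik(obs, contexts, trans_by_ctx, emit, init):
    """obs[t]: observed symbol; contexts[t]: context label at step t."""
    n_states = len(init)
    alpha = [init[s] * emit[s][obs[0]] for s in range(n_states)]
    for t in range(1, len(obs)):
        A = trans_by_ctx[contexts[t]]   # the context selects the dynamics
        alpha = [emit[j][obs[t]] *
                 sum(alpha[i] * A[i][j] for i in range(n_states))
                 for j in range(n_states)]
    return math.log(sum(alpha))

# Two hidden states, two symbols; distinct weekday/weekend dynamics,
# shared emissions (all values are illustrative, not learned).
trans = {"weekday": [[0.9, 0.1], [0.2, 0.8]],
         "weekend": [[0.5, 0.5], [0.5, 0.5]]}
emit = [[0.8, 0.2], [0.3, 0.7]]
init = [0.6, 0.4]
obs      = [0, 0, 1, 1]
contexts = ["weekday", "weekday", "weekend", "weekend"]
ll = forward_loglik(obs, contexts, trans, emit, init)
```

Anomaly detection then follows naturally: windows whose likelihood under the context-appropriate dynamics is unusually low are flagged as time-specific anomalies.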